Long-Term Visual Object Tracking via Continual Learning
Authors
Abstract
Similar Articles
Long-Term Visual Object Tracking Benchmark
In this paper, we propose a new long-video dataset (called Track Long and Prosper, TLP) and benchmark for visual object tracking. The dataset consists of 50 videos from real-world scenarios, spanning over 400 minutes (676K frames), making it more than 20-fold larger in average duration per sequence and more than 8-fold larger in total covered duration, compared to...
Visual Object Tracking via One-Class SVM
In this paper, we propose a new visual object tracking approach based on the one-class SVM (OC-SVM), inspired by the fact that an OC-SVM's support vectors form a hyper-sphere whose center can be regarded as a robust estimate of the object from the samples. In the tracking approach, a set of tracking samples is constructed within a predefined search window of a video frame, and then a threshold strategy is prop...
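As a rough illustration of this idea (not the paper's implementation), the sketch below fits scikit-learn's OneClassSVM to descriptors of candidate patches sampled from a search window and averages the image locations of the support patches as a robust position estimate; the feature values and candidate coordinates used here are hypothetical placeholders.

```python
# A rough sketch of the one-class-SVM tracking idea described above
# (not the paper's implementation); inputs are hypothetical placeholders.
import numpy as np
from sklearn.svm import OneClassSVM

def estimate_object_center(features, candidate_centers, nu=0.2):
    """features: (n, d) descriptors of candidate patches from the search window;
    candidate_centers: (n, 2) pixel coordinates of those patches."""
    ocsvm = OneClassSVM(kernel="rbf", gamma="scale", nu=nu)
    ocsvm.fit(features)
    # Support vectors lie on the boundary of the learned hyper-sphere;
    # averaging the locations of the supporting patches gives a robust
    # estimate of the object position.
    return candidate_centers[ocsvm.support_].mean(axis=0)

# Illustrative usage with random stand-in data.
rng = np.random.default_rng(0)
feats = rng.normal(size=(200, 64))
centers = rng.uniform(0, 480, size=(200, 2))
print(estimate_object_center(feats, centers))
```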
Visual Learning in Multiple-Object Tracking
BACKGROUND: Tracking moving objects in space is important for the maintenance of spatiotemporal continuity in everyday visual tasks. In the laboratory, this ability is tested using the Multiple Object Tracking (MOT) task, where participants track a subset of moving objects with attention over an extended period of time. The ability to track multiple objects with attention is severely limited. Re...
Real-time Visual Object Tracking via CamShift-Based Robust Framework
In recent years, many object tracking methods have been proposed to improve tracking accuracy. However, few of them can be applied in real-time settings due to their high computational cost. In this paper, aiming to achieve better real-time tracking performance, we propose an adaptive, robust framework for object tracking based on the CamShift approach, which is notable for its sim...
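For context, a minimal, standard OpenCV CamShift loop looks like the sketch below; the adaptive robust extensions the abstract refers to are not reproduced, and the video path and initial bounding box are hypothetical.

```python
# Baseline CamShift tracking loop with OpenCV (for context only; the paper's
# adaptive robust framework is not shown). Inputs are hypothetical.
import cv2

cap = cv2.VideoCapture("video.mp4")
ok, frame = cap.read()
x, y, w, h = 200, 150, 80, 80                      # assumed initial object box
track_window = (x, y, w, h)

# Hue histogram of the initial region, used for back-projection.
hsv_roi = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2HSV)
roi_hist = cv2.calcHist([hsv_roi], [0], None, [180], [0, 180])
cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)

term_crit = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    back_proj = cv2.calcBackProject([hsv], [0], roi_hist, [0, 180], 1)
    # CamShift adapts the window size and orientation every frame.
    rot_rect, track_window = cv2.CamShift(back_proj, track_window, term_crit)
    print("tracked window:", track_window)
```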
Journal
Journal title: IEEE Access
Year: 2019
ISSN: 2169-3536
DOI: 10.1109/access.2019.2960321